Entropy of a random variable

Consider a random variable $\mathcal{F}$ taking values $f_k$ in the alphabet $\mathcal{A} = \{f_1, f_2, \ldots, f_K\}$, with probabilities $p_k = \mathrm{Prob}\{\mathcal{F} = f_k\}$.

Self-information of one realization $f_k$: $H_k = -\log_2(p_k)$.

Entropy = average self-information: $H(\mathcal{F}) = -\sum_{f \in \mathcal{A}} p_{\mathcal{F}}(f) \log_2 p_{\mathcal{F}}(f) = \sum_{k=1}^{K} p_k H_k$.
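
A minimal sketch of these two definitions, assuming NumPy is available; the probability vector `p` below is a hypothetical example distribution, not from the source:

```python
import numpy as np

def entropy(p) -> float:
    """Shannon entropy in bits of a probability vector p (p_k >= 0, sum = 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p_k = 0 contribute 0 by convention (p log p -> 0)
    return float(-np.sum(p * np.log2(p)))

# Hypothetical example distribution over K = 4 symbols
p = np.array([0.5, 0.25, 0.125, 0.125])

# Self-information of each realization f_k: H_k = -log2(p_k)
H_k = -np.log2(p)
print(H_k)         # [1. 2. 3. 3.] bits

# Entropy is the probability-weighted average of the self-informations
print(entropy(p))  # 1.75 bits, equal to sum_k p_k * H_k
```

Note how rare outcomes carry more self-information (here 3 bits vs. 1 bit), while the entropy averages them into a single measure of the source's uncertainty.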